97 research outputs found

    A hybrid multiagent approach for global trajectory optimization

    In this paper we consider a global optimization method for space trajectory design problems. The method, which aims at finding not only the global minimizer but a whole set of low-lying local minimizers (corresponding to a set of different design options), is based on a domain decomposition technique in which each subdomain is evaluated through a procedure based on the evolution of a population of agents. The method is applied to two space trajectory design problems and compared with existing deterministic and stochastic global optimization methods.
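    The abstract describes the method only at a high level. As an illustration of the general idea (split the search box into subdomains, run a small agent population in each, and keep the best minimizer candidate from every subdomain), here is a toy sketch. The function names, the agent update rule, and the splitting scheme are all assumptions for illustration, not the paper's actual algorithm.

```python
# Illustrative sketch only: a toy domain-decomposition search in the spirit
# of the approach above. The agent-evolution rule is an assumption.
import numpy as np

def evolve_agents(f, lo, hi, n_agents=20, n_steps=50, rng=None):
    """Evolve a small agent population inside one subdomain and return the
    best point/value found (a crude stand-in for the paper's subdomain
    evaluation procedure)."""
    rng = rng or np.random.default_rng()
    pts = rng.uniform(lo, hi, size=(n_agents, len(lo)))
    vals = np.apply_along_axis(f, 1, pts)
    for _ in range(n_steps):
        # Each agent takes a random step; moves are kept only if they improve.
        steps = rng.normal(scale=0.1 * (hi - lo), size=pts.shape)
        cand = np.clip(pts + steps, lo, hi)
        cvals = np.apply_along_axis(f, 1, cand)
        better = cvals < vals
        pts[better], vals[better] = cand[better], cvals[better]
    i = vals.argmin()
    return pts[i], vals[i]

def decompose_and_search(f, lo, hi, splits=2):
    """Split the box [lo, hi] along the first axis and collect the best
    candidate from each subdomain, yielding a set of low-lying minimizers."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    edges = np.linspace(lo[0], hi[0], splits + 1)
    results = []
    for a, b in zip(edges[:-1], edges[1:]):
        sub_lo, sub_hi = lo.copy(), hi.copy()
        sub_lo[0], sub_hi[0] = a, b
        results.append(evolve_agents(f, sub_lo, sub_hi))
    return sorted(results, key=lambda r: r[1])

# Example: a 1-D function with several local minima; each subdomain search
# tends to report a different design option.
f = lambda x: np.sin(3 * x[0]) + 0.1 * x[0] ** 2
print(decompose_and_search(f, [-3.0], [3.0]))
```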

    Efficiency of evolutionary algorithms in water network pipe sizing

    The pipe sizing of water networks via evolutionary algorithms is of great interest because it allows the selection of alternative economical solutions that meet a set of design requirements. However, available evolutionary methods are numerous, and methodologies to compare their performance beyond obtaining a minimal solution for a given problem are currently lacking. A methodology to compare algorithms based on an efficiency rate (E) is presented here and applied to the pipe-sizing problem of four medium-sized benchmark networks (Hanoi, New York Tunnel, GoYang and R-9 Joao Pessoa). E numerically scores the performance of a given algorithm, accounting for both the quality of the obtained solution and the required computational effort. From the wide range of available evolutionary algorithms, four were selected to implement the methodology: a Pseudo-Genetic Algorithm (PGA), Particle Swarm Optimization (PSO), Harmony Search (HS) and a modified Shuffled Frog Leaping Algorithm (SFLA). After more than 500,000 simulations, a statistical analysis was performed based on the specific parameters each algorithm requires to operate, and E was then analyzed for each network and algorithm. The efficiency measure indicated that PGA is the most efficient algorithm for problems of greater complexity and that HS is the most efficient for less complex problems. However, the main contribution of this work is that the proposed efficiency rate provides a neutral strategy for comparing optimization algorithms and may be useful in the future to select the most appropriate algorithm for different types of optimization problems.
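    The abstract states only that E combines solution quality with computational effort; the paper's actual formula is not reproduced here. The sketch below is therefore an invented stand-in that captures the stated trade-off, and all numbers in the usage example are hypothetical.

```python
# Hedged sketch: an illustrative efficiency score combining solution quality
# and computational effort. This is NOT the paper's definition of E.
def efficiency_rate(cost, best_known_cost, evaluations, max_evaluations):
    """Toy efficiency score: high when the found design cost is close to the
    best-known cost AND few function evaluations were spent."""
    quality = best_known_cost / cost            # 1.0 means optimum reached
    effort = evaluations / max_evaluations      # fraction of budget used
    return quality * (1.0 - effort)

# Hypothetical comparison of two runs on a benchmark network
# (all costs and evaluation counts are made up for illustration):
run_pga = efficiency_rate(cost=6.19e6, best_known_cost=6.08e6,
                          evaluations=40_000, max_evaluations=100_000)
run_hs = efficiency_rate(cost=6.21e6, best_known_cost=6.08e6,
                         evaluations=25_000, max_evaluations=100_000)
print(f"PGA: {run_pga:.3f}  HS: {run_hs:.3f}")
```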

    An investigation of latency prediction for NoC-based communication architectures using machine learning techniques

    Due to the increasing number of cores in Systems on Chip (SoCs), bus architectures have suffered performance limitations. As applications demand higher bandwidth and lower latencies, buses have been unable to meet such requirements due to longer wires and increased capacitance. In this scenario, Networks on Chip (NoCs) emerged as a way to overcome the limitations of bus-based systems. Fully exploring all possible NoC configuration settings is unfeasible due to the vast design space to cover, so methods that speed up the design process are needed. In this work, we propose the use of machine learning techniques to optimise NoC architecture components during the design phase. We investigated the performance of several different ML techniques and selected Random Forest, targeting audio/video applications. The results show an accuracy of up to 90% and 85% for predictions involving arbitration and routing protocols, respectively, and up to 99% for audio/video application inference. We then evaluated other classifiers for each application individually, aiming to find the adequate one for each situation. The best class of classifiers found was the tree-based one (Random Forest, Random Tree, and M5P), which is very encouraging and points to approaches different from the current state of the art for NoC latency prediction.
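    As a minimal sketch of the kind of ML step described above, the following trains a Random Forest to predict a latency class from NoC configuration features. The feature set, labels, and synthetic data are assumptions for illustration; the paper's actual dataset comes from NoC simulations of audio/video applications.

```python
# Hedged sketch: Random Forest classification over an invented NoC
# design-space dataset. Features and labels are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 2000
# Hypothetical configuration features: buffer depth, packet size,
# arbitration scheme id, routing protocol id.
X = np.column_stack([
    rng.integers(2, 17, n),   # buffer depth (flits)
    rng.integers(4, 65, n),   # packet size (flits)
    rng.integers(0, 3, n),    # arbitration: 0=round-robin, 1=priority, 2=TDM
    rng.integers(0, 2, n),    # routing: 0=XY, 1=adaptive
])
# Synthetic "low/high latency" label loosely tied to the features.
y = ((X[:, 1] / X[:, 0]) + 0.5 * X[:, 2] - X[:, 3] > 4).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```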

    An integrated online radioassay data storage and analytics tool for nEXO

    Large-scale low-background detectors are increasingly used in rare-event searches as experimental collaborations push for enhanced sensitivity. However, building such detectors creates an abundance of radioassay data, especially during the conceptual phase of an experiment, when hundreds of materials are screened for radiopurity. A tool is needed to manage and make use of the radioassay screening data to quantitatively assess detector design options. We have developed a Materials Database Application for the nEXO experiment to serve this purpose. This paper describes the database, explains how it functions, and discusses how it streamlines the design of the experiment.
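    To make the bookkeeping concrete, here is a tiny relational stand-in for the kind of query such a tool supports: rank how much each screened material contributes to a radioactivity budget. The schema and numbers are assumptions; the real nEXO application is a web tool with its own data model.

```python
# Illustrative sketch only: a toy radioassay table and a budget query.
# Not the nEXO Materials Database Application's actual schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE assay (
    material TEXT, isotope TEXT,
    activity_mBq_per_kg REAL,   -- measured specific activity
    mass_kg REAL                -- mass of that material in the design
)""")
con.executemany("INSERT INTO assay VALUES (?, ?, ?, ?)", [
    ("copper",   "U-238",  0.02, 5000.0),   # invented example values
    ("copper",   "Th-232", 0.01, 5000.0),
    ("sapphire", "U-238",  0.30,   20.0),
])
# Rank design contributions to the total U-238 budget.
for row in con.execute("""
    SELECT material, activity_mBq_per_kg * mass_kg AS total_mBq
    FROM assay WHERE isotope = 'U-238'
    ORDER BY total_mBq DESC"""):
    print(row)
```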

    E-commerce ethics and its impact on buyer repurchase intentions and loyalty: an empirical study of small and medium Egyptian businesses

    The theoretical understanding of e-commerce has received much attention over the years; however, relatively little focus has been directed towards e-commerce ethics, especially in SME B2B e-commerce. The purpose of this paper is therefore to develop and empirically test a framework that explains the impact of SME B2B e-commerce ethics on buyer repurchase intentions and loyalty. Using structural equation modelling (SEM) to analyse data collected from a sample of SME e-commerce firms in Egypt, the results indicate that the buyers’ perceptions of supplier ethics construct is composed of six dimensions (security, non-deception, fulfilment/reliability, service recovery, shared value, and communication) and is strongly predictive of online buyer repurchase intentions and loyalty. Our results also show that fulfilment/reliability and non-deception are the most effective relationship-building dimensions. In addition, relationship quality has a positive effect on buyer repurchase intentions and loyalty. The results offer important implications for B2B e-commerce and are likely to stimulate further research in the area of relationship marketing.
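    The SEM step can be sketched in lavaan-style syntax, here assuming the Python semopy package: one latent ethics factor measured by the six dimensions, regressed onto repurchase intention and loyalty. The variable names mirror the abstract, but the data below are synthetic stand-ins, not the study's survey responses, and the model is a simplification of the paper's framework.

```python
# Hedged sketch of a measurement + structural model, assuming semopy.
# Synthetic data; not the study's dataset or full model.
import numpy as np
import pandas as pd
from semopy import Model

rng = np.random.default_rng(0)
n = 300
ethics = rng.normal(size=n)  # latent "supplier ethics" score
cols = ["security", "non_deception", "fulfilment", "recovery",
        "shared_value", "communication"]
df = pd.DataFrame({c: ethics + rng.normal(scale=0.7, size=n) for c in cols})
df["repurchase"] = 0.8 * ethics + rng.normal(scale=0.5, size=n)
df["loyalty"] = 0.7 * ethics + rng.normal(scale=0.5, size=n)

desc = """
ethics =~ security + non_deception + fulfilment + recovery + shared_value + communication
repurchase ~ ethics
loyalty ~ ethics
"""
model = Model(desc)
model.fit(df)
print(model.inspect())  # factor loadings and path estimates
```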

    Performance of novel VUV-sensitive Silicon Photo-Multipliers for nEXO

    Liquid xenon time projection chambers are promising detectors in the search for neutrinoless double beta decay ($0\nu\beta\beta$), due to their response uniformity, monolithic sensitive volume, scalability to large target masses, and suitability for extremely low background operations. The nEXO collaboration has designed a tonne-scale time projection chamber that aims to search for $0\nu\beta\beta$ of $^{136}$Xe with a projected half-life sensitivity of $1.35\times 10^{28}$ yr. To reach this sensitivity, the design goal for nEXO is $\leq 1\%$ energy resolution at the decay $Q$-value ($2458.07\pm 0.31$ keV). Reaching this resolution requires the efficient collection of both the ionization and scintillation produced in the detector. The nEXO design employs Silicon Photo-Multipliers (SiPMs) to detect the vacuum ultraviolet (175 nm) scintillation light of liquid xenon. This paper reports on the characterization of the newest VUV-sensitive Fondazione Bruno Kessler VUVHD3 SiPMs specifically designed for nEXO, as well as measurements of new test samples of the previously characterised Hamamatsu VUV4 Multi Pixel Photon Counters (MPPCs). Various SiPM and MPPC parameters, such as dark noise, gain, direct crosstalk, correlated avalanches, and photon detection efficiency, were measured as a function of the applied over voltage and wavelength at liquid xenon temperature (163 K). The results from this study are used to provide updated estimates of the achievable energy resolution at the decay $Q$-value for the nEXO design.
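    For intuition on why quanta collection drives the resolution goal, a back-of-the-envelope Poisson-limit estimate is sketched below. The W-value and the overall detection efficiency are assumed round numbers, not nEXO's measured values, and the real estimate in the paper folds in SiPM noise, crosstalk, and the ionization/scintillation anticorrelation.

```python
# Toy Poisson-limit estimate of the statistical term in the energy
# resolution at the 136Xe Q-value. Illustrative numbers only; this is not
# the nEXO resolution model.
import math

Q_KEV = 2458.07
W_EV = 13.7                    # assumed energy per quantum in liquid xenon
n_quanta = Q_KEV * 1e3 / W_EV  # total scintillation + ionization quanta
eff = 0.30                     # assumed overall detection efficiency
n_det = n_quanta * eff
print(f"Poisson-limit sigma/E ~ {1 / math.sqrt(n_det):.4%}")
```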